
Interface for solver supporting diff natively #356

Open
blegat wants to merge 24 commits into master from bl/native

Conversation


@blegat blegat commented Mar 25, 2026

The first commit was made by Claude Code during our meeting, based on this prompt: #344 (comment). I then took over.
Closes #344


codecov bot commented Mar 26, 2026

Codecov Report

❌ Patch coverage is 80.00000% with 13 lines in your changes missing coverage. Please review.
✅ Project coverage is 90.50%. Comparing base (2b4c350) to head (49f824e).
⚠️ Report is 1 commit behind head on master.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/moi_wrapper.jl | 75.00% | 7 Missing ⚠️ |
| src/ConicProgram/ConicProgram.jl | 50.00% | 2 Missing ⚠️ |
| src/NonLinearProgram/NonLinearProgram.jl | 75.00% | 2 Missing ⚠️ |
| src/QuadraticProgram/QuadraticProgram.jl | 50.00% | 2 Missing ⚠️ |
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #356      +/-   ##
==========================================
- Coverage   90.73%   90.50%   -0.24%     
==========================================
  Files          16       16              
  Lines        2300     2339      +39     
==========================================
+ Hits         2087     2117      +30     
- Misses        213      222       +9     

☔ View full report in Codecov by Sentry.


PTNobel commented Mar 27, 2026

I built a draft implementation for Moreau.jl using this branch; it worked really well! Thank you.

A few issues that popped up:

  • We need to decide whether to cache data for the backward pass when we solve the forward problem. There doesn't seem to be any way to know if we're being called from a DiffOpt context or not, so I ended up using the presence of the DiffOpt library as a proxy for whether to store backward gradients.
  • We support differentiating the quadratic term in the objective, but it seemed to me that DiffOpt only supports linear sensitivities of the objective? I wasn't sure.
  • Converting from dobj (DiffOpt) to dx (Moreau interface) required us to cache a bunch of data structures like P, x, and q that we otherwise didn't need in the Julia interface.
  • We ended up needing to cache the structure of A on the Julia side to convert dA and db into dconstraint.

Overall, I think much of this comes down to API differences that probably can't be changed at this point, but I figured it was worth mentioning.
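For what it's worth, the "presence of DiffOpt as a proxy" workaround from the first bullet can be sketched roughly as below. This is a hedged illustration only, not Moreau.jl's actual code: `ForwardCache`, `solve_forward`, `solve!`, and the field names are all hypothetical; the only real API used is `Base.find_package`.

```julia
# Hypothetical sketch: since the solver cannot tell whether it is being
# called from a DiffOpt context, use the presence of the DiffOpt package
# as a proxy for whether to cache forward-solve data for a backward pass.
const DIFFOPT_AVAILABLE = Base.find_package("DiffOpt") !== nothing

# Data the backward pass would need to map DiffOpt's dobj back to dx
# (P, q, x are otherwise not kept on the Julia side).
struct ForwardCache
    P::Matrix{Float64}   # quadratic objective term
    q::Vector{Float64}   # linear objective term
    x::Vector{Float64}   # primal solution
end

function solve!(model)
    x = solve_forward(model)   # hypothetical forward solve
    if DIFFOPT_AVAILABLE
        # Cache only when a later differentiation call is plausible.
        model.cache = ForwardCache(model.P, model.q, x)
    end
    return x
end
```

The obvious downside of this proxy is that it caches unconditionally whenever DiffOpt is merely installed, even for solves that are never differentiated; a flag on the model (or a dedicated attribute in the interface) would be a cleaner signal.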


blegat commented Apr 3, 2026

This looks good to merge; let me know if you have any comments.
